SA-Prop: Optimization of Multilayer Perceptron Parameters using Simulated Annealing

Authors

  • A. Prieto
  • V. Rivas
Abstract

A general problem in model selection is to obtain the right parameters that make a model fit observed data. If the model selected is a Multilayer Perceptron (MLP) trained with Backpropagation (BP), it is necessary to find appropriate initial weights and learning parameters. This paper proposes a method, termed SA-Prop, that combines Simulated Annealing (SimAnn) and BP to train MLPs with a single hidden layer. SimAnn selects the initial weights and the learning rate of the network. SA-Prop combines the advantages of the stochastic search performed by Simulated Annealing over the MLP parameter space and the local search of the BP algorithm. The application of the proposed method to several real-world benchmark problems shows that MLPs evolved using SA-Prop achieve a higher level of generalization than other perceptron training algorithms, such as QuickPropagation (QP) or RPROP, and other evolutionary algorithms, such as G-LVQ.
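The hybrid scheme described above can be illustrated with a minimal sketch (not the authors' code): simulated annealing proposes candidate states consisting of the MLP's initial weights plus a learning rate, and each candidate is scored by the loss reached after a short backpropagation run. The toy XOR dataset, network size, cooling schedule, and all constants below are assumptions for illustration; the paper itself uses real-world benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR problem (an assumption; the paper uses real-world benchmarks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_HID = 4  # single hidden layer, as in SA-Prop

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(params):
    """Split the flat SA state into weight matrices and a learning rate."""
    w1 = params[:2 * N_HID].reshape(2, N_HID).copy()
    w2 = params[2 * N_HID:3 * N_HID].reshape(N_HID, 1).copy()
    lr = abs(params[-1]) + 1e-3  # keep the learning rate positive
    return w1, w2, lr

def bp_loss(params, epochs=200):
    """Train briefly with plain backpropagation; return the final MSE."""
    w1, w2, lr = unpack(params)
    for _ in range(epochs):
        h = sigmoid(X @ w1)
        out = sigmoid(h @ w2)
        err = out - y
        d_out = err * out * (1 - out)       # output-layer delta
        d_hid = (d_out @ w2.T) * h * (1 - h)  # hidden-layer delta
        w2 -= lr * h.T @ d_out
        w1 -= lr * X.T @ d_hid
    return float(np.mean(err ** 2))

# Simulated annealing over [initial weights | learning rate].
state = rng.normal(0, 0.5, size=3 * N_HID + 1)
cost = bp_loss(state)
best_cost = cost
T = 1.0
for step in range(300):
    cand = state + rng.normal(0, 0.1, size=state.shape)
    c = bp_loss(cand)
    # Metropolis acceptance: always take improvements, sometimes take
    # worse states while the temperature is still high.
    if c < cost or rng.random() < np.exp((cost - c) / T):
        state, cost = cand, c
    best_cost = min(best_cost, cost)
    T *= 0.99  # geometric cooling schedule (an assumption)

print(f"best BP loss found by the SA search: {best_cost:.4f}")
```

The point of the hybrid is visible in `bp_loss`: SA never moves individual weights toward the data itself; it only chooses where BP's gradient descent starts and how fast it steps, while BP does the local refinement.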


Similar articles

Classification of Breast Masses in Mammograms Using Radial Basis Functions and Simulated Annealing

We present pattern classification methods based upon nonlinear and combinational optimization techniques, specifically, radial basis functions (RBF) and simulated annealing (SA), to classify masses in mammograms as malignant or benign. Combinational optimization is used to pre-estimate RBF parameters, namely, the centers and spread matrix. The classifier was trained and tested, using the leave-...

Full text

G-Prop: Global optimization of multilayer perceptrons using GAs

A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a multilayer perceptron (MLP) trained with back-propagation (BP), this means finding appropriate layer size and initial weights. This paper proposes a method (G-Prop, genetic backpropagation) that attempts to solve that problem by combining a genetic algorithm (GA) and BP to train MLPs w...

Full text

G-Prop-II: Global Optimization of Multilayer Perceptrons using GAs

A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a Multilayer Perceptron (MLP) trained with Backpropagation (BP), this means finding the right hidden layer size, appropriate initial weights and learning parameters. This paper proposes a method (G-Prop-II) that attempts to solve that problem by combining a genetic algorithm (GA) and ...

Full text

Using design of experiments approach and simulated annealing algorithm for modeling and Optimization of EDM process parameters

The main objectives of this research are, therefore, to assess the effects of process parameters and to determine their optimal levels for machining of Inconel 718 superalloy. Gap voltage, current, machining time, and duty factor are the tuning parameters studied as process inputs. Furthermore, two important process output characteristics have been evaluated in this research...

Full text


Journal:

Volume   Issue 

Pages  -

Publication date: 1998